Human-Robot Interaction
Sound Judgment: Properties of Consequential Sounds Affecting Human-Perception of Robots
Allen, Aimee, Drummond, Tom, Kulić, Dana
Positive human-perception of robots is critical to achieving sustained use of robots in shared environments. One key factor affecting human-perception of robots is their sounds, especially the consequential sounds which robots (as machines) must produce as they operate. This paper explores qualitative responses from 182 participants to gain insight into human-perception of robot consequential sounds. Participants viewed videos of different robots performing their typical movements, and responded to an online survey regarding their perceptions of robots and the sounds they produce. Topic analysis was used to identify common properties of robot consequential sounds that participants expressed liking, disliking, wanting, or wanting to avoid being produced by robots. Alongside expected reports of disliking high-pitched and loud sounds, many participants preferred informative and audible sounds (over no sound) to provide predictability of the robot's purpose and trajectory. Rhythmic sounds were preferred over acute or continuous sounds, and many participants wanted more natural sounds (such as wind or cat purrs) in place of machine-like noise. The results presented in this paper support future research on methods to improve consequential sounds produced by robots, by highlighting features of sounds that cause negative perceptions and providing insights into sound profile changes that improve human-perception of robots, thus enhancing human-robot interaction.
- Research Report > Experimental Study (1.00)
- Questionnaire & Opinion Survey (0.89)
- Research Report > New Finding (0.68)
Proceedings of the AI-HRI Symposium at AAAI-FSS 2022
Han, Zhao, Senft, Emmanuel, Ahmad, Muneeb I., Bagchi, Shelly, Yazdani, Amir, Wilson, Jason R., Kim, Boyoung, Wen, Ruchen, Hart, Justin W., García, Daniel Hernández, Leonetti, Matteo, Mead, Ross, Mirsky, Reuth, Prabhakar, Ahalya, Zimmerman, Megan L.
The Artificial Intelligence (AI) for Human-Robot Interaction (HRI) Symposium has been a successful venue of discussion and collaboration on AI theory and methods aimed at HRI since 2014. This year, after a review of the achievements of the AI-HRI community over the last decade in 2021, we are focusing on a visionary theme: exploring the future of AI-HRI. Accordingly, we added a Blue Sky Ideas track to foster a forward-thinking discussion on future research at the intersection of AI and HRI. As always, we appreciate all contributions related to any topic on AI/HRI and welcome new researchers who wish to take part in this growing community. With the success of past symposia, AI-HRI impacts a variety of communities and problems, and has pioneered the discussions in recent trends and interests. This year's AI-HRI Fall Symposium aims to bring together researchers and practitioners from around the globe, representing a number of university, government, and industry laboratories. In doing so, we hope to accelerate research in the field, support technology transition and user adoption, and determine future directions for our group and our research.
Autonomous Vehicles – Do We Really Know The Risks? – Human Robot Interaction
Autonomous Vehicles (AVs) are the riskiest form of human-robot interaction. On the one hand, they offer unparalleled improvements to the safety and comfort of drivers, passengers and other traffic participants, and they promise to reduce emissions. On the other hand, they demand new considerations for trust and responsibilities in human-robot interaction. The field of tension between autonomy, trust and liability can only be manoeuvred on the basis of objective data.
Good egg? Robot chef is trained to make the 'perfect' omelette
A team of engineers has trained a robot to prepare and cook an omelette, from breaking the egg to presenting it on a plate to the diner. Researchers from the University of Cambridge worked with domestic appliance firm Beko to train the machine to create the best omelette for the majority of tastes. The team say cooking is an interesting problem for roboticists as 'humans can never be totally objective when it comes to food' or how it should taste. They used machine learning data from a study of volunteers and their reactions to different omelettes cooked in a variety of ways in order to train the robot. The omelette made by the robotic chef 'generally tasted great – much better than expected', according to the research team who tested the resulting dish.
- Europe > United Kingdom > England > Cambridgeshire > Cambridge (0.26)
- North America > United States > New York (0.05)
Researchers: Driverless cars can't just be safe. They also need to be nice.
Turns out, we may not actually want driverless cars to drive like us. That's according to researchers at the University of Michigan, who say they've found three core "personality" traits autonomous vehicles need to have to make people feel safer with them - even if they themselves don't have those same traits. Car makers already know some drivers still feel weird about the idea of sharing the road with an AV, much less actually using one. "That's partly because [autonomous vehicles] are designed purely from the technical perspectives, and those don't necessarily comply with our human social norms," says X. Jessie Yang, a professor in U of M's Department of Industrial and Operations Engineering, and the School of Information. "Like consider you're a pedestrian who wants to cross the street," she adds. "When you're interacting with a human driving a car, you'll have eye contact.
- Automobiles & Trucks (0.94)
- Transportation > Passenger (0.62)
- Transportation > Ground > Road (0.62)
- Information Technology > Robotics & Automation (0.62)
Speech-Gesture Mapping and Engagement Evaluation in Human Robot Interaction
Ghosh, Bishal, Dhall, Abhinav, Singla, Ekta
A robot needs contextual awareness, effective speech production and complementary non-verbal gestures for successful communication in society. In this paper, we present our end-to-end system that aims to enhance the effectiveness of non-verbal gestures. To achieve this, we identified prominently used gestures in performances by TED speakers, mapped them to their corresponding speech context, and modulated speech based upon the attention of the listener. The proposed method utilized Convolutional Pose Machine [4] to detect human gestures. Dominant gestures of TED speakers were used for learning the gesture-to-speech mapping, and their speeches were used for training the model. We also evaluated the engagement of the robot with people by conducting a social survey. The effectiveness of the performance was monitored by the robot, which self-improvised its speech pattern on the basis of the attention level of the audience, calculated using visual feedback from the camera. The effectiveness of the interaction, as well as the decisions made during improvisation, was further evaluated based on head-pose detection and an interaction survey.
- Research Report (0.64)
- Questionnaire & Opinion Survey (0.48)
Theoretical Concerns for the Integration of Repair
Trott, Sean (University of California, San Diego) | Rossano, Federico (University of California, San Diego)
Human conversation is messy. Speakers frequently repair their speech, and listeners must therefore integrate information across ill-formed, often fragmentary inputs. Previous dialogue systems for human-robot interaction (HRI) have addressed certain problems in dialogue repair, but there are many problems that remain. In this paper, we discuss these problems from the perspective of Conversation Analysis, and argue that a more holistic account of dialogue repair will actually aid in the design and implementation of machine dialogue systems.
Kar-robotic-pod-deliver-direct-door.html
Kar-Go the robotic pod (artist's impression pictured) could soon be delivering packages direct to your front door, if the startup firm behind its creation can find funding to create a fleet of the vehicle. Kar-Go uses state-of-the-art artificial intelligence software to detect and manoeuvre around hazards. As the vehicle arrives at each delivery address, the system automatically selects the package belonging to the corresponding customer for delivery. The Academy of Robotics has already gained permission from the UK government to test a prototype of the vehicle (pictured) on public roads.
- Transportation > Ground > Road (1.00)
- Information Technology > Robotics & Automation (1.00)
- Government > Regional Government > Europe Government > UK Government (0.95)
google-ai-photos-boring-recommend-remove-artificial-intelligence-a7773916.html
It's targeting dull things like screenshots and photos of documents, notes and receipts, and will hide them from your main picture stream.
- Information Technology > Robotics & Automation (1.00)
- Automobiles & Trucks > Manufacturer (1.00)